Reviews: Model-Agnostic Private Learning

Neural Information Processing Systems

The paper considers a new differentially private learning setting that receives a collection of unlabeled public data in addition to the labeled private data of interest, and assumes that the two data sets are drawn from the same distribution. The proposed technique allows the use of (non-private) agnostic PAC learners as black-box oracles and adapts to the structure of the data sets. The idea is summarized below: 1. Perform differentially private model serving in a data-adaptive fashion, through the "sparse vector" technique and "subsample-and-aggregate". This only handles a finite number of classification queries. 2. Using the properties of an agnostic PAC learner, behave similarly to step 1, but now handle an unbounded number of classification queries.
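The subsample-and-aggregate step described above can be sketched roughly as follows. This is a minimal illustration under my own assumptions: the function names and parameters are hypothetical, and a plain Laplace-noised majority vote stands in for the paper's full query-answering mechanism (which additionally uses the sparse-vector technique to bound the privacy cost over many queries).

```python
import numpy as np

def subsample_and_aggregate_fit(X, y, n_teachers, fit_fn):
    """Partition the private data into disjoint subsets and train one
    (non-private) teacher per subset using the black-box learner fit_fn.
    Because the subsets are disjoint, each private example influences
    exactly one teacher."""
    idx = np.array_split(np.random.permutation(len(X)), n_teachers)
    return [fit_fn(X[i], y[i]) for i in idx]

def noisy_vote(teachers, x, n_classes, eps):
    """Answer one classification query by adding Laplace noise to the
    teachers' vote histogram; changing one private example changes at
    most one vote, so the histogram has sensitivity 1."""
    votes = np.zeros(n_classes)
    for t in teachers:
        votes[t(x)] += 1.0
    votes += np.random.laplace(scale=1.0 / eps, size=n_classes)
    return int(np.argmax(votes))
```

A toy usage: any non-private learner (here, a trivial majority-label rule) can be plugged in as `fit_fn`, and queries are served only through `noisy_vote`, never through the teachers directly.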


On the Learnability of Multilabel Ranking

Raman, Vinod, Subedi, Unique, Tewari, Ambuj

arXiv.org Artificial Intelligence

Multilabel ranking is a central task in machine learning. However, the most fundamental question of learnability in a multilabel ranking setting with relevance-score feedback remains unanswered. In this work, we characterize the learnability of multilabel ranking problems in both batch and online settings for a large family of ranking losses. Along the way, we give two equivalence classes of ranking losses based on learnability that capture most, if not all, losses used in practice.